We present a novel end-to-end neural model to extract entities and relations between them. Our recurrent neural network based model captures both word sequence and dependency tree substructure information by stacking bidirectional tree-structured LSTM-RNNs on bidirectional sequential LSTM-RNNs. This allows our model to jointly represent both entities and relations with shared parameters in a single model. We further encourage detection of entities during training and use of entity information in relation extraction via entity pretraining and scheduled sampling. Our model improves over the state-of-the-art feature-based model on end-to-end relation extraction, achieving 12.1% and 5.7% relative error reductions in F1-score on ACE2005 and ACE2004, respectively. We also show that our LSTM-RNN based model compares favorably to the state-of-the-art CNN based model (in F1-score) on nominal relation classification (SemEval-2010 Task 8). Finally, we present an extensive ablation analysis of several model components.
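The scheduled sampling mentioned above replaces gold entity labels with the model's own predictions, with increasing probability as training proceeds, so the relation layer learns to cope with imperfect entity detection at test time. A minimal sketch of this mechanism (the inverse-sigmoid decay schedule, the constant `k`, and all function names here are illustrative assumptions, not details taken from the abstract):

```python
import math
import random


def gold_feed_probability(epoch, k=5.0):
    """Probability of feeding the gold entity label to the relation layer.

    Uses an inverse-sigmoid decay so the probability starts near 1 and
    shrinks as the epoch grows. Both the schedule and k are assumptions
    for illustration only.
    """
    return k / (k + math.exp(epoch / k))


def scheduled_sample(gold_label, predicted_label, epoch, rng=random):
    """Pick the label to feed downstream: gold with decaying probability,
    otherwise the model's own prediction (the scheduled-sampling step)."""
    if rng.random() < gold_feed_probability(epoch):
        return gold_label
    return predicted_label
```

Early in training this almost always feeds the gold label (e.g. at epoch 0 the probability is k / (k + 1)); late in training it mostly feeds the prediction, matching the test-time condition.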